Seismic shothole drillers’ logs record the near-surface (avg. 18.6 m deep) lithostratigraphy encountered when drilling holes to place explosive charges. These records offer a largely unrecognized wealth of geoscience information in areas about which little may otherwise be known. Stored in the Basic Files archives of petroleum exploration and seismic acquisition companies, this study first convinced...
Seismologists from Kazakhstan, Russia, and the United States have rescued the Soviet-era archive of nuclear explosion seismograms recorded at Borovoye in northern Kazakhstan during the period 1966–1996. The signals had been stored on about 8000 magnetic tapes, which were held at the recording observatory. After hundreds of man-years of work, these digital waveforms together with significant metadata...
Myriad environmental satellite missions are currently orbiting the earth. The comprehensive monitoring by these sensors provides scientists, policymakers, and the public critical information on the earth’s weather and climate system. The state-of-the-art technology of our satellite monitoring system is the legacy of the first environmental satellites, the Nimbus systems launched by NASA in the mid-1960s...
Geological models, as structural representations of the subsurface, are increasingly used for regional scale geological analyses and research studies. In this context, it is often essential to use geological legacy data, for example in the form of printed well logs, seismic sections, or maps and interpreted cross-sections from previous reports. A problem when using this type of data is that standard...
Over the course of a scientific career, a large fraction of the data collected by scientific investigators turns into data at risk of becoming inaccessible to future science. Although a part of the investigators’ data is made available in manuscripts and databases, other data may remain unpublished, non-digital, on degrading or near-obsolete digital media, or inadequately documented for reuse. In...
The youth of seismology as a science, compared to the typical duration of seismic cycles, results in a relative scarcity of records of large earthquakes available for processing by modern analytical techniques. This in turn makes archived datasets of historical seismograms extremely valuable for enhancing our understanding of the occurrence of large, destructive earthquakes. Unfortunately,...
Historical bedrock field observations have potential for significant value to the scientific community and the public if they can be rescued from physical records stored in archives of scientific research institutions. A set of historical records from ‘Operation Norman’, a bedrock mapping activity conducted in northwestern Canada by the Geological Survey of Canada (GSC) from 1968 to 1970, was identified...
Scientific ocean drilling began in 1968 and ever since has been generating huge amounts of data, including that from shipboard analysis of cores, in situ borehole measurements, long-term subseafloor hydrogeological observatories, and post-expedition research done on core samples and data at laboratories around the world (Smith et al., 2010). Much of the data collected aboard the drilling vessels are...
Adopting standards for data and metadata collection is necessary for success of data rescue and preservation initiatives. Physical sample data and metadata rescue and preservation can be particularly challenging in that much of the available information may not be readily digitized or machine-readable. Making the legacy data rescue and preservation process as simple as possible through the development...
Data generated as a result of publicly funded research in the USA and other countries are now required to be available in public data repositories. However, many scientific data over the past 50+ years were collected at a time when the technology for curation, storage, and dissemination was primitive or non-existent, and consequently many of these datasets are not available publicly. These so-called...
The Global Sea Level Observing System (GLOSS) Group of Experts (GE) data archaeology group is collating tools and producing guidelines for historic sea level data. They aim to aid the discovery, scanning, digitising and quality control of analogue tide gauge charts and sea level ledgers. Their goal is to improve the quality, quantity and availability of long-term sea level data series. This paper...
What is the value of ‘old’ data when much more sophisticated data are being acquired today in huge quantities with modern equipment and served up in ready-to-use form? Why the hype over delving into the past, when the observers were undoubtedly less well informed than they are today? What can such old records possibly teach us that we don’t already know better from modern electronic data and today’s...
No central database or repository is currently available in the USA to preserve long-term, spatially extensive records of fluvial geomorphic data or to provide future accessibility. Yet, because of their length and continuity, these data are valuable for future research. Therefore, we built a publicly accessible website to preserve data records of two examples of long-term monitoring (40 and 18 years)...
This paper considers legacy data and data rescue within the context of geomorphology. Data rescue may be necessary depending upon the storage medium (whether it is physically accessible) and the data format (e.g. digital file type); where either of these is not functional, intervention will be required in order to retrieve the stored data. Within geomorphological research, there are three scenarios that may...
Amino acid racemization (AAR) dating methods have been used since the mid-1960s. Since that time, information technologies have evolved as AAR laboratories have worked to appropriately catalog sample collections and analyses. The University of Delaware AAR Database (UDAARDB) is a database of AAR and other geochronological data from coastal Quaternary sites in North and South America that has been...
Tide gauge data are identified as legacy data given the radical transition between observation method and required output format associated with tide gauges over the 20th-century. Observed water level variation through tide-gauge records is regarded as the only significant basis for determining recent historical variation (decade to century) in mean sea-level and storm surge. There are limited tide...
Earth Observation data acquired by the Landsat missions are of immense value to the global community and constitute the world’s longest continuous civilian Earth Observation program. However, because of the costs of data storage infrastructure these data have traditionally been stored in raw form on tape storage infrastructures which introduces a data retrieval and processing overhead that limits...
On timescales beyond the life of a research project, a core task in the curation of digital research data is the migration of data and metadata to new storage media, new hardware, and software systems. These migrations are necessitated by ageing software systems, ageing hardware systems, and the rise of new technologies in data management. Using the example of the German Continental Deep Drilling...